Completing any low-rank matrix, provably
Authors
Abstract
Matrix completion concerns the recovery of a low-rank matrix from a subset of its revealed entries, and nuclear norm minimization has emerged as an effective surrogate for this combinatorial problem. Here, we show that nuclear norm minimization can recover an arbitrary n × n matrix of rank r from O(nr log(n)) revealed entries, provided that revealed entries are drawn proportionally to the local row and column coherences (closely related to leverage scores) of the underlying matrix. Our results are order-optimal up to logarithmic factors, and extend existing results for nuclear norm minimization which require strong incoherence conditions on the types of matrices that can be recovered, due to assumed uniformly distributed revealed entries. We further provide extensive numerical evidence that a proposed two-phase sampling algorithm can perform nearly as well as local-coherence sampling without requiring a priori knowledge of the matrix coherence structure. Finally, we apply our results to quantify how weighted nuclear norm minimization can improve on unweighted minimization given an arbitrary set of sampled entries.
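The local coherences (leverage scores) that drive the sampling distribution can be computed directly from the SVD of the underlying matrix. The sketch below uses a simplified, illustrative sampling rule with assumed constants (the function names, the scaling constant `c`, and the exact log factor are choices for illustration, not the paper's precise distribution):

```python
import numpy as np

def local_coherences(M, r):
    """Row and column leverage scores of the best rank-r approximation of M.
    Standard scaling: mu_i = (n1/r) * ||e_i^T U_r||^2, so the mu_i average to 1."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    U_r, V_r = U[:, :r], Vt[:r, :].T
    n1, n2 = M.shape
    mu = (n1 / r) * np.sum(U_r**2, axis=1)   # row coherences
    nu = (n2 / r) * np.sum(V_r**2, axis=1)   # column coherences
    return mu, nu

def sampling_probabilities(mu, nu, r, c=2.0):
    """Entry-wise sampling probabilities proportional to local coherences:
    p_ij = min(1, c * (mu_i + nu_j) * r * log(n) / n).
    An illustrative rule; c and the log power are assumptions."""
    n = max(len(mu), len(nu))
    P = c * (mu[:, None] + nu[None, :]) * r * np.log(n) / n
    return np.minimum(P, 1.0)

# Example: rank-2 matrix with one "spiky" (high-coherence) row
rng = np.random.default_rng(0)
n, r = 100, 2
A = rng.standard_normal((n, r))
B = rng.standard_normal((n, r))
A[0] *= 10.0                     # make row 0 highly coherent
M = A @ B.T
mu, nu = local_coherences(M, r)
P = sampling_probabilities(mu, nu, r)
```

High-coherence rows and columns receive proportionally more samples under `P`, which is what allows recovery of matrices that violate the uniform-incoherence conditions assumed by standard uniform sampling.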
Similar references
Algorithms for $\ell_p$ Low-Rank Approximation
We consider the problem of approximating a given matrix by a low-rank matrix so as to minimize the entrywise $\ell_p$-approximation error, for any p ≥ 1; the case p = 2 is the classical SVD problem. We obtain the first provably good approximation algorithms for this version of low-rank approximation that work for every value of p ≥ 1, including p = ∞. Our algorithms are simple, easy to implement, wor...
Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices by Convex Optimization
Principal component analysis is a fundamental operation in computational data analysis, with myriad applications ranging from web search to bioinformatics to computer vision and image analysis. However, its performance and applicability in real scenarios are limited by a lack of robustness to outlying or corrupted observations. This paper considers the idealized “robust principal component anal...
Sharp analysis of low-rank kernel matrix approximations
We consider supervised learning problems within the positive-definite kernel framework, such as kernel ridge regression, kernel logistic regression or the support vector machine. With kernels leading to infinite-dimensional feature spaces, a common practical limiting difficulty is the necessity of computing the kernel matrix, which most frequently leads to algorithms with running time at least ...
Sketchy Decisions: Convex Low-Rank Matrix Optimization with Optimal Storage
This paper concerns a fundamental class of convex matrix optimization problems. It presents the first algorithm that uses optimal storage and provably computes a low-rank approximation of a solution. In particular, when all solutions have low rank, the algorithm converges to a solution. This algorithm, SketchyCGM, modifies a standard convex optimization scheme, the conditional gradient method, t...
Journal: Journal of Machine Learning Research
Volume: 16
Pages: -
Published: 2015